Sharpening Sparse Regularizers via Smoothing

Authors

Abstract

Non-convex sparsity-inducing penalties outperform their convex counterparts, but generally sacrifice the convexity of the cost function. As a middle ground, we propose the sharpening sparse regularizers (SSR) framework to design non-separable non-convex penalties that induce sparsity more effectively than convex penalties such as the ℓ1 and nuclear norms, without sacrificing the overall convexity of the cost function. The overall problem convexity is preserved by exploiting the relative strong convexity of the data fidelity term. The framework constructs penalties as a difference of functions, namely between a convex sparsity-inducing penalty and its smoothed version. We employ a generalized infimal convolution smoothing technique to obtain the smoothed version. Furthermore, SSR recovers and generalizes several penalties in the literature as special cases. The framework is applicable to any regularized least squares ill-posed linear inverse problem. Beyond least squares, it can be extended to accommodate Bregman divergences and other sparsity structures such as low-rankness. The optimization is formulated as a saddle point problem and solved by a scalable forward-backward splitting algorithm. The effectiveness of the framework is demonstrated by numerical experiments in different applications.
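As a concrete illustration of the difference-of-functions construction mentioned in the abstract, the sketch below sharpens the ℓ1 norm by subtracting its Moreau-envelope (Huber) smoothing and minimizes the resulting regularized least squares cost by forward-backward splitting. This is a toy sketch under our own assumptions (the names `ssr_fbs`, `huber`, the smoothing parameter `mu`, and the step-size choice are ours), not the authors' exact algorithm:

```python
import numpy as np

def huber(x, mu):
    # Moreau envelope of the l1 norm with parameter mu (elementwise Huber).
    a = np.abs(x)
    return np.sum(np.where(a <= mu, a**2 / (2 * mu), a - mu / 2))

def huber_grad(x, mu):
    # Gradient of the Huber smoothing; Lipschitz with constant 1/mu.
    return np.clip(x / mu, -1.0, 1.0)

def soft(x, t):
    # Proximal operator of t*||.||_1 (soft thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ssr_fbs(A, y, lam, mu, n_iter=500):
    """Forward-backward splitting for
       min_x 0.5||y - Ax||^2 + lam*(||x||_1 - huber_mu(x)).
    Smooth part: 0.5||y - Ax||^2 - lam*huber_mu(x) (gradient step);
    non-smooth part: lam*||x||_1 (soft-threshold step)."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2 + lam / mu  # Lipschitz bound for the smooth part
    t = 1.0 / L
    for _ in range(n_iter):
        g = A.T @ (A @ x - y) - lam * huber_grad(x, mu)
        x = soft(x - t * g, t * lam)
    return x
```

Note that `mu` trades off how aggressively the penalty is sharpened: the subtracted term makes the regularizer non-convex on its own, while (per the abstract) overall convexity can be retained when the data fidelity is sufficiently strongly convex relative to it.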


Similar resources

Smoothing Regularizers for Projective Basis Function Networks

Smoothing regularizers for radial basis functions have been studied extensively, but no general smoothing regularizers for projective basis functions (PBFs), such as the widely-used sigmoidal PBFs, have heretofore been proposed. We derive new classes of algebraically simple mth-order smoothing regularizers for networks of projective basis functions f(W; x) = \sum_{j=1}^{N} u_j g(x^T v_j + v_{j0}) + u_0, with genera...
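For concreteness, the network form in the snippet above, f(W; x) = Σ_j u_j g(xᵀv_j + v_{j0}) + u_0, can be evaluated as follows. This is a minimal sketch assuming a logistic sigmoid for g (one common choice); all names are ours:

```python
import numpy as np

def sigmoid(z):
    # Logistic sigmoid, a typical projective basis function g.
    return 1.0 / (1.0 + np.exp(-z))

def pbf_network(x, V, v0, u, u0):
    """Projective basis function network:
       f(W; x) = sum_j u_j * g(x^T v_j + v_j0) + u_0,
    where row j of V is the projection vector v_j."""
    return u @ sigmoid(V @ x + v0) + u0
```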


Robust Discriminative Clustering with Sparse Regularizers

Clustering high-dimensional data often requires some form of dimensionality reduction, where clustered variables are separated from “noise-looking” variables. We cast this problem as finding a low-dimensional projection of the data which is well-clustered. This yields a one-dimensional projection in the simplest situation with two clusters, and extends naturally to a multi-label scenario for mo...


Linear Smoothing Morphological Sharpening + Kuwahara-Nagao Operator

This report sheds some new light on an old and trusted tool in the image processing toolbox. The Kuwahara-Nagao operator is known as an edge-preserving smoothing operator. This report shows that we don't need to rely on our intuition to verify this claim: it follows from basic principles. To that end we will first cast the classical Kuwahara-Nagao operator into a more modern framework ...
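The edge-preserving behavior the snippet refers to is easy to demonstrate. Below is a minimal, naive sketch of the classical Kuwahara filter (our own illustration, not the report's decomposition): each output pixel takes the mean of whichever of the four overlapping quadrant windows around it has the smallest variance, so flat regions are smoothed while step edges are copied through unchanged.

```python
import numpy as np

def kuwahara(img, r=1):
    """Classical Kuwahara filter on a 2-D grayscale image.
    For each interior pixel, consider the four (r+1)x(r+1) quadrants
    that share the center pixel; output the mean of the quadrant with
    the smallest variance. Border pixels are left unchanged."""
    h, w = img.shape
    out = img.astype(float).copy()
    for i in range(r, h - r):
        for j in range(r, w - r):
            quads = [img[i - r:i + 1, j - r:j + 1],  # upper-left
                     img[i - r:i + 1, j:j + r + 1],  # upper-right
                     img[i:i + r + 1, j - r:j + 1],  # lower-left
                     img[i:i + r + 1, j:j + r + 1]]  # lower-right
            best = min(quads, key=lambda q: q.var())
            out[i, j] = best.mean()
    return out
```

On a vertical step edge, the quadrant lying entirely on one side of the edge always has zero variance, so the edge survives the filter exactly.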


Efficient Sparse Recovery via Adaptive Non-Convex Regularizers with Oracle Property

The main shortcoming of sparse recovery with a convex regularizer is that it is a biased estimator and therefore will result in a suboptimal performance in many cases. Recent studies have shown, both theoretically and empirically, that non-convex regularizer is able to overcome the biased estimation problem. Although multiple algorithms have been developed for sparse recovery with non-convex re...
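The bias that the snippet refers to is easy to see numerically: the soft-threshold operator (the proximal map of the convex ℓ1 penalty) shrinks even very large coefficients by the full threshold, while a non-convex alternative such as hard thresholding leaves large coefficients untouched. A minimal illustration (our own, not the paper's algorithm):

```python
import numpy as np

def soft_threshold(x, t):
    # Prox of the l1 norm: every surviving coefficient is shrunk by t (biased).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def hard_threshold(x, t):
    # Prox of the l0 penalty: large coefficients pass through unshrunk (unbiased).
    return np.where(np.abs(x) > t, x, 0.0)
```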


ADMM for Training Sparse Structural SVMs with Augmented ℓ1 Regularizers

The size |Y| of the output space Y is exponential, and optimization over the entire space Y is computationally expensive. Hence, in the sequential dual optimization method, the optimization of (A.6) is restricted to the set Y_i = {y : α_iy > 0} maintained for each example. For clarity, we present the sequential dual optimization method to solve (A.2) in Algorithm 3. The algorithm starts with Y_i = ...
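The restriction to the active sets Y_i described in the snippet follows a generic working-set pattern, sketched below under a hypothetical interface of our own: `oracle` returns the most violating output for an example (a loss-augmented argmax), and `solve_restricted` optimizes the dual variables over the current Y_i only. This illustrates the idea, not the paper's Algorithm 3:

```python
def sequential_dual(examples, oracle, solve_restricted, n_outer=10):
    """Generic working-set loop: the exponential output space is never
    enumerated; each example keeps an active set Y_i = {y : alpha_iy > 0}
    that is grown by the oracle and pruned after each restricted solve."""
    active = {i: set() for i in range(len(examples))}
    alpha = {}
    for _ in range(n_outer):
        for i, xi in enumerate(examples):
            y_new = oracle(xi)                # most violating output for example i
            active[i].add(y_new)              # grow the working set
            alpha = solve_restricted(alpha, i, active[i])  # optimize duals on Y_i
            active[i] = {y for y in active[i] if alpha.get((i, y), 0.0) > 0}
    return alpha
```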



Journal

Journal title: IEEE Open Journal of Signal Processing

Year: 2021

ISSN: 2644-1322

DOI: https://doi.org/10.1109/ojsp.2021.3104497